
    Presenting patient data in the electronic care record: the role of timelines

    OBJECTIVE: To establish the current level of awareness and investigate the use of timelines within clinical computing systems as an organized display of the electronic patient record (EPR). DESIGN: Multicentre survey conducted using questionnaires and interviews. SETTING: Seven UK hospitals and several general practice surgeries. PARTICIPANTS: A total of 120 healthcare professionals completed a questionnaire which directed structured interviews. Participants fell into two cohorts according to whether or not they had used clinical timelines, giving 60 timeline users and 60 prospective timeline users. MAIN OUTCOME MEASURES: To investigate the awareness of timelines, and the potential benefits of timelines within clinical computing systems. RESULTS: Fifty-eight percent of participants had not heard of the specific term 'timelines' despite 75% of users utilizing a form of timeline on a daily basis. The potential benefits attributed to future timelines were clinical audit (95% CI 77.6-91.6%), increased time efficiency (95% CI 77.7-91.6%), reduced clinical error (95% CI 71.0-86.7%) and improved patient safety (95% CI 70.0-85.9%). One continuous timeline view between primary and secondary care was considered to be of great potential benefit in allowing communication via a unified patient record. CONCLUSIONS: The concept of timelines has enjoyed proven success in healthcare in the USA and in other sectors worldwide. Clinicians are supportive of timelines in healthcare. Formal input from clinicians should be sought when designing and implementing computer systems in healthcare. Timelines in healthcare support clinicians' cognitive processes by increasing the amount of data available and improving the way in which data are presented.
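    The parenthesised ranges above appear to be 95% confidence intervals for the percentage of respondents endorsing each benefit. As a rough illustration only (the study's exact denominators are not given here), a Wald interval for a proportion \hat{p} estimated from n respondents is

        \hat{p} \pm 1.96\,\sqrt{\frac{\hat{p}(1-\hat{p})}{n}}

    so that, for example, \hat{p} = 0.846 with n = 120 gives roughly 0.846 \pm 0.065, i.e. about 78-91%, of the same order as the reported 77.6-91.6%.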

    Unravelling the rate of action of hits in the Leishmania donovani box using standard drugs amphotericin B and miltefosine

    In recent years, the neglected diseases drug discovery community has elected phenotypic screening as the key approach for the identification of novel hit compounds. However, when this approach is applied, important questions related to the mode of action of these compounds remain unanswered. One such question concerns the rate of action, a useful piece of information when facing the challenge of prioritising the most promising hit compounds. In the present work, compounds of the "Leishmania donovani box" were evaluated using a rate of action assay adapted from a recently developed replicative intracellular high-content assay. The potency of each compound was determined every 24 hours up to 96 hours, and the standard drugs amphotericin B and miltefosine were used as references to group the compounds according to their rate of action. Independently of this biological assessment, compounds were also clustered according to their minimal chemical scaffold. Comparison of the results showed a complete correlation between chemical scaffold and biological group for the vast majority of compounds, demonstrating that the assay was able to provide information on the rate of action of each chemical series, a property directly linked to the mode of action. Overall, the assay described here allowed us to evaluate the rate of action of the "Leishmania donovani box" using two of the currently available drugs as references and to propose to the wider scientific community a number of fast-acting chemical scaffolds present in the box as starting points for future drug discovery projects. The results presented here validate the use of this assay for the determination of the rate of action early in the discovery process, to assist in the prioritisation of hit compounds.
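    To make the grouping idea concrete, the sketch below classifies compounds as fast- or slow-acting from potencies measured at 24-hour intervals, with amphotericin B and miltefosine as anchors. It is a hypothetical illustration only: the classification rule (time needed to reach 90% of the 96-hour potency) and the example pEC50 values are assumptions, not the paper's analysis.

        # Illustrative sketch: group compounds as fast- or slow-acting from potencies
        # measured at 24 h intervals, using amphotericin B (fast reference) and
        # miltefosine (slower reference). The 90%-of-final-potency rule and the
        # example pEC50 values are hypothetical.
        from typing import Dict, List

        TIMEPOINTS = [24, 48, 72, 96]  # hours

        def time_to_full_potency(pec50: List[float], fraction: float = 0.9) -> int:
            """First timepoint at which potency reaches `fraction` of its 96 h value."""
            final = pec50[-1]
            for t, p in zip(TIMEPOINTS, pec50):
                if p >= fraction * final:
                    return t
            return TIMEPOINTS[-1]

        def classify(compounds: Dict[str, List[float]],
                     fast_ref: str = "amphotericin B",
                     slow_ref: str = "miltefosine") -> Dict[str, str]:
            """Label each compound relative to the midpoint of the two references."""
            cutoff = (time_to_full_potency(compounds[fast_ref]) +
                      time_to_full_potency(compounds[slow_ref])) / 2.0
            return {name: ("fast-acting" if time_to_full_potency(curve) <= cutoff
                           else "slow-acting")
                    for name, curve in compounds.items()}

        # Hypothetical pEC50 values at 24, 48, 72 and 96 hours.
        example = {
            "amphotericin B": [6.8, 6.9, 7.0, 7.0],
            "miltefosine":    [4.9, 5.6, 6.1, 6.3],
            "compound 1":     [6.5, 6.6, 6.6, 6.7],
            "compound 2":     [4.5, 5.2, 5.9, 6.4],
        }
        print(classify(example))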

    A Measurement of Rb using a Double Tagging Method

    The fraction of Z to bbbar events in hadronic Z decays has been measured by the OPAL experiment using the data collected at LEP between 1992 and 1995. The Z to bbbar decays were tagged using displaced secondary vertices, and high momentum electrons and muons. Systematic uncertainties were reduced by measuring the b-tagging efficiency using a double tagging technique. Efficiency correlations between opposite hemispheres of an event are small, and are well understood through comparisons between real and simulated data samples. A value of Rb = 0.2178 +- 0.0011 +- 0.0013 was obtained, where the first error is statistical and the second systematic. The uncertainty on Rc, the fraction of Z to ccbar events in hadronic Z decays, is not included in the errors. The dependence on Rc is Delta(Rb)/Rb = -0.056*Delta(Rc)/Rc, where Delta(Rc) is the deviation of Rc from the value 0.172 predicted by the Standard Model. The result for Rb agrees with the value of 0.2155 +- 0.0003 predicted by the Standard Model. (Comment: 42 pages, LaTeX, 14 eps figures included, submitted to European Physical Journal.)
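    In its simplest form the double tagging technique determines both the b-tagging efficiency and Rb from the data alone. Neglecting hemisphere efficiency correlations and non-b backgrounds (which the full analysis does account for), if f_s is the fraction of tagged hemispheres and f_d the fraction of events with both hemispheres tagged, then

        f_s = \epsilon_b R_b, \qquad f_d = \epsilon_b^{2} R_b
        \;\;\Longrightarrow\;\; \epsilon_b = \frac{f_d}{f_s}, \qquad R_b = \frac{f_s^{2}}{f_d}

    so the tagging efficiency \epsilon_b is measured from data rather than taken from simulation, which is what reduces the systematic uncertainty.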

    Measurement of the B+ and B-0 lifetimes and search for CP(T) violation using reconstructed secondary vertices

    The lifetimes of the B+ and B0 mesons, and their ratio, have been measured in the OPAL experiment using 2.4 million hadronic Z0 decays recorded at LEP. Z0 to bbbar decays were tagged using displaced secondary vertices and high momentum electrons and muons. The lifetimes were then measured using well-reconstructed charged and neutral secondary vertices selected in this tagged data sample. The results are tau(B+) = 1.643 +/- 0.037 +/- 0.025 ps, tau(B0) = 1.523 +/- 0.057 +/- 0.053 ps and tau(B+)/tau(B0) = 1.079 +/- 0.064 +/- 0.041, where in each case the first error is statistical and the second systematic. A larger data sample of 3.1 million hadronic Z0 decays has been used to search for CP and CPT violating effects by comparison of inclusive b and bbar hadron decays. No evidence for such effects is seen. The CP violation parameter Re(epsilon_B) is measured to be Re(epsilon_B) = 0.001 +/- 0.014 +/- 0.003, and the fractional difference between b and bbar hadron lifetimes is measured to be (Delta tau/tau)_b = [tau(b hadron) - tau(bbar hadron)]/tau(average) = -0.001 +/- 0.012 +/- 0.008.

    Measurement of the top quark mass using the matrix element technique in dilepton final states

    We present a measurement of the top quark mass in ppbar collisions at a center-of-mass energy of 1.96 TeV at the Fermilab Tevatron collider. The data were collected by the D0 experiment and correspond to an integrated luminosity of 9.7 fb^-1. The matrix element technique is applied to ttbar events in the final state containing leptons (electrons or muons) with high transverse momenta and at least two jets. The calibration of the jet energy scale determined in the lepton+jets final state of ttbar decays is applied to jet energies. This correction provides a substantial reduction in systematic uncertainties. We obtain a top quark mass of mt = 173.93 ± 1.84 GeV.
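    Schematically, the matrix element technique assigns each event x a probability density as a function of the assumed top quark mass by integrating the leading-order matrix element over the unmeasured parton kinematics y, folded with the parton distribution functions f(q) and detector transfer functions W(x|y):

        P_{evt}(x; m_t) = \frac{1}{\sigma_{obs}(m_t)} \int dy\, f(q_1)\, f(q_2)\, |\mathcal{M}(y; m_t)|^{2}\, W(x|y)

    and m_t is extracted by maximising the likelihood \mathcal{L}(m_t) = \prod_i P_{evt}(x_i; m_t) over all selected events. This is the generic form of the method; the exact normalisation and transfer functions are analysis-specific.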

    Large-Scale Sequence Analysis of Hemagglutinin of Influenza A Virus Identifies Conserved Regions Suitable for Targeting an Anti-Viral Response

    BACKGROUND: The Influenza A viral surface protein, hemagglutinin, is the major target of the neutralizing antibody response and hence a main constituent of all vaccine formulations. However, due to its marked evolutionary variability, vaccines have to be reformulated so as to include the hemagglutinin protein from the emerging new viral strain. With the constant fear of a pandemic, there is a critical need for the development of anti-viral strategies that can provide wider protection against any Influenza A pathogen. An anti-viral approach that is directed against the conserved regions of the hemagglutinin protein has the potential to protect against any current and new Influenza A virus and provide a solution to this ever-present threat to public health. METHODOLOGY/PRINCIPAL FINDINGS: Influenza A human hemagglutinin protein sequences available in the NCBI database, corresponding to the H1, H2, H3 and H5 subtypes, were used to identify highly invariable regions of the protein. Nine such regions were identified and analyzed for structural properties such as surface exposure, hydrophilicity and residue type to evaluate their suitability for targeting an anti-peptide antibody/anti-viral response. CONCLUSION/SIGNIFICANCE: This study has identified nine conserved regions in the hemagglutinin protein, five of which have the structural characteristics suitable for an anti-viral/anti-peptide response. This is a critical step in the design of efficient anti-peptide antibodies as novel anti-viral agents against any Influenza A pathogen. In addition, these anti-peptide antibodies will provide broadly cross-reactive immunological reagents and aid the rapid development of vaccines against new and emerging Influenza A strains.
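    As a rough illustration of how invariable regions can be located in practice, the sketch below scores each alignment column by Shannon entropy and reports windows whose average entropy is low. It is a generic conservation scan under assumed inputs, not the authors' pipeline; the window length, entropy cutoff and toy alignment are arbitrary.

        # Generic conservation scan (illustrative): score alignment columns by Shannon
        # entropy and report windows of low average entropy.
        from collections import Counter
        from math import log2
        from typing import List

        def column_entropy(column: str) -> float:
            """Shannon entropy (bits) of the residues in one alignment column."""
            counts = Counter(c for c in column if c != "-")
            total = sum(counts.values())
            if total == 0:
                return 0.0
            return -sum((n / total) * log2(n / total) for n in counts.values())

        def conserved_windows(alignment: List[str], window: int = 9,
                              max_entropy: float = 0.2) -> List[int]:
            """Start positions (0-based) of windows whose mean column entropy is low."""
            length = len(alignment[0])
            entropies = [column_entropy("".join(seq[i] for seq in alignment))
                         for i in range(length)]
            return [i for i in range(length - window + 1)
                    if sum(entropies[i:i + window]) / window <= max_entropy]

        # Toy alignment; a real run would use thousands of H1/H2/H3/H5 HA sequences.
        aln = ["MKAILVVLLYTFAT",
               "MKAILVVLLYTFTT",
               "MKAILVVMLYTFAT"]
        print(conserved_windows(aln, window=5, max_entropy=0.15))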

    A systematic review of cooling for neuroprotection in neonates with hypoxic ischemic encephalopathy – are we there yet?

    BACKGROUND: The objective of this study was to systematically review randomized trials assessing therapeutic hypothermia as a treatment for term neonates with hypoxic ischemic encephalopathy. METHODS: The Cochrane Central Register of Controlled Trials, MEDLINE, EMBASE, CINAHL databases, reference lists of identified studies, and proceedings of the Pediatric Academic Societies were searched in July 2006. Randomized trials assessing the effect of therapeutic hypothermia by either selective head cooling or whole body cooling in term neonates were eligible for inclusion in the meta-analysis. The primary outcome was death or neurodevelopmental disability at ≥ 18 months. RESULTS: Five trials involving 552 neonates were included in the analysis. Cooling techniques and the definition and severity of neurodevelopmental disability differed between studies. Overall, there is evidence of a significant effect of therapeutic hypothermia on the primary composite outcome of death or disability (RR: 0.78, 95% CI: 0.66, 0.92; NNT: 8, 95% CI: 5, 20) as well as on the single outcomes of mortality (RR: 0.75, 95% CI: 0.59, 0.96) and neurodevelopmental disability at 18 to 22 months (RR: 0.72, 95% CI: 0.53, 0.98). Adverse effects include benign sinus bradycardia (RR: 7.42, 95% CI: 2.52, 21.87) and thrombocytopenia (RR: 1.47, 95% CI: 1.07, 2.03; NNH: 8) without deleterious consequences. CONCLUSION: In general, therapeutic hypothermia seems to have a beneficial effect on the outcome of term neonates with moderate to severe hypoxic ischemic encephalopathy. Despite the methodological differences between trials, wide confidence intervals, and the lack of follow-up data beyond the second year of life, the consistency of the results is encouraging. Further research is necessary to minimize the uncertainty regarding efficacy and safety of any specific technique of cooling for any specific population.
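    For readers less familiar with these summary measures, the number needed to treat is the reciprocal of the absolute risk reduction:

        \mathrm{NNT} = \frac{1}{p_{\text{control}} - p_{\text{treated}}}, \qquad \mathrm{RR} = \frac{p_{\text{treated}}}{p_{\text{control}}}

    Using purely illustrative figures chosen to be consistent with the reported values rather than taken from the trials: if the control-group risk of death or disability were about 0.60, then RR = 0.78 gives p_treated ≈ 0.47, an absolute risk reduction of ≈ 0.13, and NNT ≈ 1/0.13 ≈ 8.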

    Search for the standard model Higgs boson at LEP


    AutoClickChem: Click Chemistry in Silico

    Academic researchers, and many in industry, often lack the financial resources available to scientists working in “big pharma.” High costs include those associated with high-throughput screening and chemical synthesis. In order to address these challenges, many researchers have in part turned to alternative methodologies. Virtual screening, for example, often substitutes for high-throughput screening, and click chemistry ensures that chemical synthesis is fast, cheap, and comparatively easy. Though both in silico screening and click chemistry seek to make drug discovery more feasible, it is not yet routine to couple these two methodologies. Here we present a novel computer algorithm, called AutoClickChem, capable of performing many click-chemistry reactions in silico. AutoClickChem can be used to produce large combinatorial libraries of compound models for use in virtual screens. As the compounds of these libraries are constructed according to the reactions of click chemistry, they can be easily synthesized for subsequent testing in biochemical assays. Additionally, in silico modeling of click-chemistry products may prove useful in rational drug design and drug optimization. AutoClickChem is based on the pymolecule toolbox, a framework that may facilitate the development of future Python-based programs that require the manipulation of molecular models. Both the pymolecule toolbox and AutoClickChem are released under the GNU General Public License version 3 and are available for download from http://autoclickchem.ucsd.edu.
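    To illustrate the general idea of click chemistry in silico (and not AutoClickChem's own implementation or its pymolecule API), the sketch below uses the open-source RDKit toolkit to enumerate 1,4-disubstituted 1,2,3-triazoles from a few hypothetical azide and alkyne building blocks with a single reaction template; AutoClickChem itself handles many more click-chemistry reaction types.

        # Illustrative only: enumerate azide-alkyne "click" products with RDKit.
        # This is not AutoClickChem; the building blocks and the single reaction
        # template below are assumptions made for the example.
        from rdkit import Chem
        from rdkit.Chem import AllChem

        # Terminal alkyne + organic azide -> 1,4-disubstituted 1,2,3-triazole.
        click = AllChem.ReactionFromSmarts(
            "[C:1]#[CH1:2].[N:3]=[N+:4]=[N-:5]>>"
            "[C:1]1=[CH1:2][N+0:3][N+0:4]=[N+0:5]1"
        )

        alkynes = [Chem.MolFromSmiles(s) for s in ("C#Cc1ccccc1", "C#CCO")]
        azides = [Chem.MolFromSmiles(s) for s in ("[N-]=[N+]=NCc1ccccc1",)]

        library = set()
        for alkyne in alkynes:
            for azide in azides:
                for products in click.RunReactants((alkyne, azide)):
                    product = products[0]
                    Chem.SanitizeMol(product)        # valence/aromaticity check
                    library.add(Chem.MolToSmiles(product))

        print(sorted(library))  # canonical SMILES of the enumerated triazoles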

    Non-Invasive Quantification of White and Brown Adipose Tissues and Liver Fat Content by Computed Tomography in Mice

    OBJECTIVES: Obesity and its distribution pattern are important factors for the prediction of the onset of diabetes in humans. Since several mouse models are suitable for studying the pathophysiology of type 2 diabetes, the aim was to validate a novel computed tomography scanner (Aloka-Hitachi LCT-200) for the quantification of visceral, subcutaneous, brown and intrahepatic fat depots in mice. METHODS: Different lean and obese mouse models (C57BL/6, B6.V-Lep(ob), NZO) were used to determine the most adequate scanning parameters for the detection of the different fat depots. The data were compared with those obtained after preparation and weighing of the fat depots. Liver fat content was determined by biochemical analysis. RESULTS: The correlations between fat depot weights determined on a scale and those determined by CT were significant for subcutaneous (r(2) = 0.995), visceral (r(2) = 0.990) and total white adipose tissue (r(2) = 0.992). Moreover, scans in the abdominal region, between lumbar vertebrae L4 and L5, correlated with whole-body fat distribution, allowing experimenters to reduce scanning time and animal exposure to radiation and anesthesia. Test-retest reliability and measurements conducted by different experimenters showed high reproducibility of the results. Intrahepatic fat content estimated by CT was linearly related to that determined by biochemical analysis (r(2) = 0.915). Furthermore, brown fat mass correlated well with the weighed brown fat depots (r(2) = 0.952). In addition, short-term cold exposure (4 °C, 4 hours) led to alterations in brown adipose tissue attributed to a reduction in triglyceride content that can be visualized as an increase in Hounsfield units by CT imaging. CONCLUSION: 3D imaging of fat by CT provides reliable results for the quantification of total, visceral, subcutaneous, brown and intrahepatic fat in mice. This non-invasive method allows longitudinal studies of obesity to be conducted in mice and therefore enables experimenters to investigate the onset of complex diseases such as diabetes and obesity.
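    The underlying measurement principle can be sketched in a few lines: voxels whose attenuation falls within an adipose-tissue Hounsfield-unit window are counted and converted into a volume. The HU window, voxel size and random test volume below are generic illustrative values, not the calibration of the LCT-200 used in the study.

        # Rough sketch of HU-threshold-based fat quantification (illustrative values).
        import numpy as np

        def fat_volume_mm3(ct_hu, voxel_size_mm, hu_window=(-190.0, -30.0)):
            """Volume (mm^3) of voxels whose attenuation lies in the adipose HU window."""
            lo, hi = hu_window
            mask = (ct_hu >= lo) & (ct_hu <= hi)
            voxel_mm3 = voxel_size_mm[0] * voxel_size_mm[1] * voxel_size_mm[2]
            return float(mask.sum()) * voxel_mm3

        # Hypothetical scan filled with random attenuation values, for demonstration only.
        rng = np.random.default_rng(0)
        scan = rng.uniform(-300.0, 100.0, size=(40, 128, 128))
        volume_cm3 = fat_volume_mm3(scan, voxel_size_mm=(0.5, 0.096, 0.096)) / 1000.0
        print(f"estimated adipose volume: {volume_cm3:.1f} cm^3")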